MALT - a Multi-lingual Adaptive Language Tutor

Authors

  • Matthias Scheutz
  • Michael Heilman
  • Aaron Wenger
  • Colleen Ryan-Scheutz
Abstract

We describe a “Multi-lingual Adaptive Language Tutor” (MALT) that uses natural language parsing and text generation to create various kinds of grammar exercises for learners of any language. These exercises can be restricted by the instructor to specific topics, such as the transformation of verb tenses. MALT generates novel exercises focusing on the specific difficulties of language learners as determined from their past mistakes, helping them overcome individual difficulties faster. We also present the first preliminary results from employing MALT in the foreign language classroom at Notre Dame.


Similar references

HMM-based polyglot speech synthesis by speaker and language adaptive training

This paper describes a technique for speaker and language adaptive training (SLAT) for HMM-based polyglot speech synthesis and its evaluations on a multi-lingual speech corpus. The SLAT technique allows multi-speaker/multi-language adaptive training and synthesis to be performed. Experimental results show that the SLAT technique achieves better naturalness than both speaker-adaptively trained l...


Multi-lingual phoneme recognition exploiting acoustic-phonetic similarities of sounds

The aim of this work is to exploit the acoustic-phonetic similarities between several languages. In recent work cross-language HMM-based phoneme models have been used only for bootstrapping the language-dependent models, and the multi-lingual approach has been investigated only on very small speech corpora. In this paper, we introduce a statistical distance measure to determine the similarities...


Cross-lingual adaptation with multi-task adaptive networks

Posterior-based or bottleneck features derived from neural networks trained on out-of-domain data may be successfully applied to improve speech recognition performance when data is scarce for the target domain or language. In this paper we combine this approach with the use of a hierarchical deep neural network (DNN) network structure – which we term a multi-level adaptive network (MLAN) – and ...


Waterloo at NTCIR-3: Using Self-supervised Word Segmentation

In this paper, we describe the system we use in the NTCIR-3 CLIR (cross-language IR) task. We participated in the SLIR (single-language IR) track. In our system, we use a self-supervised word-segmentation technique for Chinese information retrieval, which combines the advantages of traditional dictionary-based approaches with character-based approaches, while overcoming many of their shortcomings. ...


Semi-automatic test generation for tandem learning

We introduce a Web-based CALL architecture that facilitates the construction of learner-customized multiple-choice tests in a cross-lingual tandem language learning environment. Mistakes made by the learner are manually corrected and classified by his tandem partner, who acts as a tutor. If the learner has problems identifying and correcting his mistakes, or if he would like to practice, he can generat...



Journal title:

Volume   Issue 

Pages  -

Publication date: 2005